Search results for "rejection sampling"

Showing 10 of 14 documents

Avoiding Boundary Effects in Wang-Landau Sampling

2003

A simple modification of the "Wang-Landau sampling" algorithm removes the systematic error that occurs at the boundary of the energy range over which the random walk takes place in the original algorithm.
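To make the setting concrete, here is a minimal Wang-Landau sketch for a toy system of independent two-state spins (so the exact density of states is binomial and easy to check); the toy system, sweep length and flatness criterion are illustrative assumptions, and the boundary correction proposed in the paper is not implemented here.

```python
import numpy as np
from math import comb

# Toy Wang-Landau sampler: estimate the density of states g(E) for N
# independent two-state spins, where E = number of "up" spins, so the exact
# answer is the binomial coefficient C(N, E).  Illustrative sketch only.
rng = np.random.default_rng(0)
N = 10
spins = rng.integers(0, 2, N)      # current configuration
E = int(spins.sum())               # current energy, in 0..N

log_g = np.zeros(N + 1)            # running estimate of log g(E)
hist = np.zeros(N + 1)             # visit histogram for the flatness check
log_f = 1.0                        # modification factor ln f

while log_f > 1e-4:
    for _ in range(10_000):
        i = rng.integers(N)                    # propose flipping one spin
        E_new = E + (1 - 2 * int(spins[i]))    # energy after the flip
        # accept with probability min(1, g(E) / g(E_new))
        if np.log(rng.random()) < log_g[E] - log_g[E_new]:
            spins[i] ^= 1
            E = E_new
        log_g[E] += log_f                      # update density of states
        hist[E] += 1                           # and the visit histogram
    if hist.min() > 0.8 * hist.mean():         # histogram "flat enough"
        log_f /= 2.0                           # refine the modification factor
        hist[:] = 0

# Deviation from the exact (binomial) log density of states, up to a constant.
exact = np.log([comb(N, k) for k in range(N + 1)])
print(np.round((log_g - log_g[0]) - (exact - exact[0]), 2))
```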

Subjects: Heterogeneous random walk in one dimension; Statistical Mechanics (cond-mat.stat-mech); Rejection sampling; Slice sampling; Sampling (statistics); Boundary (topology); Random walk; Combinatorics; Range (statistics); Applied mathematics; Energy (signal processing); Condensed Matter - Statistical Mechanics; Mathematics

On the stability and ergodicity of adaptive scaling Metropolis algorithms

2011

The stability and ergodicity properties of two adaptive random walk Metropolis algorithms are considered. Both algorithms continuously adjust the scaling of the proposal distribution based on the observed acceptance probability. Unlike previously proposed forms of the algorithms, the adapted scaling parameter is not constrained within a predefined compact interval. The first algorithm is based on scale adaptation only, while the second also incorporates covariance adaptation. A strong law of large numbers is shown to hold assuming that the target density is smooth enough and has either compact support or super-exponentially decaying tails.
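As a rough illustration of this kind of scale adaptation, the sketch below nudges the log proposal scale toward a target acceptance rate with a decaying step size; the standard normal target, the 0.44 acceptance target and the step-size schedule are assumptions chosen for illustration, not the paper's exact algorithms.

```python
import numpy as np

# Sketch of a 1-D adaptive-scaling random walk Metropolis sampler: the log of
# the proposal standard deviation is nudged toward a target acceptance rate
# with a decaying (stochastic-approximation) step size.  Assumed target: N(0, 1).
rng = np.random.default_rng(1)

def log_target(x):
    return -0.5 * x * x                      # log of an unnormalised N(0, 1) density

n_iter, target_accept = 50_000, 0.44         # 0.44 is a common 1-D tuning target
x, log_scale = 0.0, 0.0                      # current state and log proposal scale
chain = np.empty(n_iter)

for n in range(1, n_iter + 1):
    prop = x + np.exp(log_scale) * rng.normal()
    log_alpha = min(0.0, log_target(prop) - log_target(x))   # log acceptance prob
    if np.log(rng.random()) < log_alpha:
        x = prop
    # Adapt: grow the scale when accepting too often, shrink it otherwise.
    # Note the scale is *not* constrained to a compact interval, which is
    # exactly the unconstrained setting the paper analyses.
    log_scale += n ** -0.6 * (np.exp(log_alpha) - target_accept)
    chain[n - 1] = x

print("adapted proposal scale:", np.exp(log_scale))
print("sample mean / variance:", chain[25_000:].mean(), chain[25_000:].var())
```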

Subjects: Statistics and Probability; Stochastic approximation; Statistics Theory (math.ST); Law of large numbers; Multiple-try Metropolis; Stability (probability); Modelling and Simulation; 65C40, 60J27, 93E15, 93E35; Adaptive Markov chain Monte Carlo; Scaling; Metropolis algorithm; Applied Mathematics; Rejection sampling; Ergodicity; Probability (math.PR); Covariance; Random walk; Metropolis–Hastings algorithm; Algorithm; Stability; Mathematics; Stochastic Processes and their Applications

Bayesian Smoothing in the Estimation of the Pair Potential Function of Gibbs Point Processes

1999

A flexible Bayesian method is suggested for the pair potential estimation with a high-dimensional parameter space. The method is based on a Bayesian smoothing technique, commonly applied in statistical image analysis. For the calculation of the posterior mode estimator a new Monte Carlo algorithm is developed. The method is illustrated through examples with both real and simulated data, and its extension into truly nonparametric pair potential estimation is discussed.

Subjects: Statistics and Probability; Mathematical optimization; posterior mode estimator; Markov chain Monte Carlo methods; Monte Carlo method; Bayesian probability; Rejection sampling; Estimator; Bayesian smoothing; Gibbs processes; Hybrid Monte Carlo; Marquardt algorithm; pair potential function; Algorithm; Mathematics; Gibbs sampling; Bernoulli

Parsimonious adaptive rejection sampling

2017

Monte Carlo (MC) methods have become very popular in signal processing during the past decades. Adaptive rejection sampling (ARS) algorithms are well-known MC techniques that draw independent samples efficiently from univariate target densities. The ARS schemes yield a sequence of proposal functions that converge toward the target, so that the probability of accepting a sample approaches one. However, sampling from the proposal pdf becomes more computationally demanding each time it is updated. We propose the Parsimonious Adaptive Rejection Sampling (PARS) method, in which an efficient trade-off between acceptance rate and proposal complexity is obtained. Thus, the resulting algorithm is f…
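For readers who want the accept/reject mechanism that ARS and PARS refine spelled out, here is a minimal non-adaptive rejection sampler; the target exp(-x^4), the normal proposal and the bound M are illustrative assumptions, and the adaptive construction of proposal functions described above is deliberately omitted.

```python
import numpy as np

# Minimal rejection sampler for the unnormalised 1-D target p(x) = exp(-x**4),
# using a standard normal proposal g and a constant M such that p(x) <= M*g(x).
# Purely illustrative: ARS/PARS instead build and refine a sequence of proposal
# functions so that the acceptance probability approaches one.
rng = np.random.default_rng(2)

def p(x):                                   # unnormalised target density
    return np.exp(-x ** 4)

def g_pdf(x):                               # standard normal proposal density
    return np.exp(-0.5 * x * x) / np.sqrt(2 * np.pi)

# max_x p(x)/g(x) = sqrt(2*pi) * exp(1/16), attained at x^2 = 1/4
M = np.sqrt(2 * np.pi) * np.exp(1.0 / 16.0)

def rejection_sample(n):
    accepted, n_proposed = [], 0
    while len(accepted) < n:
        x = rng.normal()                          # draw from the proposal
        n_proposed += 1
        if rng.random() * M * g_pdf(x) <= p(x):   # accept with prob p(x)/(M*g(x))
            accepted.append(x)
    return np.array(accepted), n_proposed

samples, n_proposed = rejection_sample(5_000)
print("empirical acceptance rate :", 5_000 / n_proposed)
print("sample standard deviation :", samples.std())
```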

Subjects: Signal processing; Sequence; Computer science; Monte Carlo method; Rejection sampling; Univariate; Sampling (statistics); Sample (statistics); Statistics - Computation (stat.CO); Adaptive filter; Electrical and Electronic Engineering; Algorithm; Electronics Letters

A new strategy for effective learning in population Monte Carlo sampling

2016

In this work, we focus on advancing the theory and practice of a class of Monte Carlo methods, population Monte Carlo (PMC) sampling, for dealing with inference problems with static parameters. We devise a new method for efficient adaptive learning from past samples and weights in order to construct improved proposal functions. It is based on the assumption that, at each iteration, there is an intermediate target, and that this target gradually gets closer to the true one. Computer simulations confirm the improvement of the proposed strategy over the traditional PMC method in a simple test scenario.
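For orientation, a bare-bones iteration of the traditional PMC scheme that the paper improves upon might look like the sketch below; the bimodal toy target, Gaussian kernels, population size and deterministic-mixture weighting are assumptions chosen for illustration, not the proposed strategy itself.

```python
import numpy as np

# Bare-bones population Monte Carlo (PMC) sketch for a 1-D bimodal target.
# Each iteration: draw one sample per proposal location, importance-weight it
# against the mixture of proposal kernels, estimate, then resample locations.
# All settings here (target, kernel width, population size) are assumptions.
rng = np.random.default_rng(3)

def target(x):                       # unnormalised mixture of two Gaussians
    return np.exp(-0.5 * (x - 3) ** 2) + np.exp(-0.5 * (x + 3) ** 2)

N, sigma, n_iter = 200, 1.0, 30
mu = rng.uniform(-10, 10, N)         # initial proposal locations

for it in range(n_iter):
    x = mu + sigma * rng.normal(size=N)          # one draw per kernel
    # mixture importance weights: q(x_i) = (1/N) * sum_j N(x_i; mu_j, sigma^2)
    diff = x[:, None] - mu[None, :]
    q = np.exp(-0.5 * (diff / sigma) ** 2).mean(axis=1) / (sigma * np.sqrt(2 * np.pi))
    w = target(x) / q
    w /= w.sum()
    est_mean = np.sum(w * x)                     # self-normalised IS estimate
    mu = rng.choice(x, size=N, p=w)              # resample -> adapted locations

print("estimated posterior mean (true value 0):", est_mean)
```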

Subjects: Mathematical optimization; Computer science; Monte Carlo method; Inference; Hybrid Monte Carlo; [INFO.INFO-TS] Computer Science [cs]/Signal and Image Processing; Quasi-Monte Carlo method; Kinetic Monte Carlo; Rejection sampling; Sampling (statistics); Markov chain Monte Carlo; Dynamic Monte Carlo method; Monte Carlo integration; Monte Carlo method in statistical physics; Artificial intelligence; Particle filter; [SPI.SIGNAL] Engineering Sciences [physics]/Signal and Image processing; Monte Carlo molecular modeling

Grapham: Graphical models with adaptive random walk Metropolis algorithms

2008

Recently developed adaptive Markov chain Monte Carlo (MCMC) methods have been applied successfully to many problems in Bayesian statistics. Grapham is a new open source implementation covering several such methods, with emphasis on graphical models for directed acyclic graphs. The implemented algorithms include the seminal Adaptive Metropolis algorithm, which adjusts the proposal covariance according to the history of the chain, and a Metropolis algorithm that adjusts the proposal scale based on the observed acceptance probability. Different variants of the algorithms allow one, for example, to use these two algorithms together, employ delayed rejection and adjust several parameters of the algorithm…
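The first of those two algorithms is, in spirit, the Haario-type covariance adaptation; a minimal 2-D sketch of that idea (not Grapham itself, which is a full open-source package) could look as follows, with the correlated Gaussian target, scaling constant and regularisation chosen purely for illustration.

```python
import numpy as np

# Minimal 2-D Adaptive Metropolis sketch: the Gaussian proposal covariance is
# rebuilt from the chain history (every 50 steps, to keep the sketch cheap),
# scaled by 2.38^2 / d and regularised with a small diagonal term.
# This is an illustration of the idea, not Grapham's implementation.
rng = np.random.default_rng(4)
d = 2
Sigma = np.array([[1.0, 0.9],
                  [0.9, 1.0]])               # assumed correlated Gaussian target
Sigma_inv = np.linalg.inv(Sigma)

def log_target(x):
    return -0.5 * x @ Sigma_inv @ x

n_iter, adapt_start = 20_000, 500
x = np.zeros(d)
chain = np.empty((n_iter, d))
prop_cov = 0.1 * np.eye(d)                   # initial proposal covariance

for n in range(n_iter):
    prop = rng.multivariate_normal(x, prop_cov)
    if np.log(rng.random()) < log_target(prop) - log_target(x):
        x = prop
    chain[n] = x
    if n >= adapt_start and n % 50 == 0:     # adapt from the history so far
        prop_cov = (2.38 ** 2 / d) * np.cov(chain[: n + 1].T) + 1e-6 * np.eye(d)

print("empirical covariance of the chain (target has 0.9 correlation):")
print(np.round(np.cov(chain[5_000:].T), 2))
```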

Subjects: Statistics and Probability; Markov chain; Adaptive algorithm; Applied Mathematics; Rejection sampling; Markov chain Monte Carlo; Multiple-try Metropolis; Statistics - Computation (stat.CO); Computational Mathematics; Metropolis–Hastings algorithm; Computational Theory and Mathematics; Graphical model; Algorithm; Mathematics; Gibbs sampling; Computational Statistics & Data Analysis

Contributed discussion on article by Pratola

2016

The author should be commended for his outstanding contribution to the literature on Bayesian regression tree models. The author introduces three innovative sampling approaches which allow for efficient traversal of the model space. In this response, we add a fourth alternative.

Subjects: Statistics and Probability; model selection; Markov Chain Monte Carlo (MCMC); Bayesian regression tree (BRT) models; Computer science; Big data; birth–death process; Machine learning; Sequential Monte Carlo methods; population Markov chain Monte Carlo; Bayesian Regression Trees (BART); Bayesian treed regression; Multiple Try Metropolis algorithms; statistical inference; Applied Mathematics; Rejection sampling; Data science; Variable-order Bayesian network; Tree (data structure); Tree traversal; continuous time Markov process; Artificial intelligence; Bayesian linear regression; communication-free; Gibbs sampling; Bayesian Analysis

Recycling Gibbs sampling

2017

Gibbs sampling is a well-known Markov chain Monte Carlo (MCMC) algorithm, extensively used in signal processing, machine learning and statistics. The key point for the successful application of the Gibbs sampler is the ability to draw samples from the full-conditional probability density functions efficiently. In the general case this is not possible, so auxiliary samples must be generated in order to speed up the convergence of the chain. However, such intermediate information is ultimately disregarded. In this work, we show that these auxiliary samples can be recycled within the Gibbs estimators, improving their efficiency with no extra cost. Theoretical and exhaustive numerical co…
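To make the recycling idea concrete, here is a toy two-component Gibbs sampler for a correlated Gaussian in which the intermediate, half-updated states are kept in the estimator instead of being discarded; the target, the correlation value and this particular form of recycling are illustrative assumptions rather than the authors' estimators.

```python
import numpy as np

# Toy "recycling" Gibbs sketch for a bivariate Gaussian with correlation rho.
# A standard Gibbs sweep updates x1 | x2 and then x2 | x1, but only stores the
# state at the end of the sweep; here the half-updated state is also kept in
# the estimator, since each conditional update leaves the target invariant.
# (Illustrative only: the paper recycles auxiliary draws more generally.)
rng = np.random.default_rng(5)
rho = 0.8
n_sweeps = 20_000
x1, x2 = 0.0, 0.0
states = []                                  # every intermediate state is recycled

for _ in range(n_sweeps):
    # x1 | x2 ~ N(rho * x2, 1 - rho^2)
    x1 = rho * x2 + np.sqrt(1 - rho ** 2) * rng.normal()
    states.append((x1, x2))                  # half-updated state, normally discarded
    # x2 | x1 ~ N(rho * x1, 1 - rho^2)
    x2 = rho * x1 + np.sqrt(1 - rho ** 2) * rng.normal()
    states.append((x1, x2))                  # end-of-sweep state

states = np.array(states[2_000:])            # drop a burn-in period
print("estimated means :", np.round(states.mean(axis=0), 3))              # true (0, 0)
print("estimated corr. :", np.round(np.corrcoef(states.T)[0, 1], 3))      # true 0.8
```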

Subjects: Computer science; Monte Carlo method; Slice sampling; Markov process; Probability density function; Machine learning; Hybrid Monte Carlo; [INFO.INFO-TS] Computer Science [cs]/Signal and Image Processing; Rejection sampling; Estimator; Markov chain Monte Carlo; Artificial intelligence; [SPI.SIGNAL] Engineering Sciences [physics]/Signal and Image processing; Algorithm; Gibbs sampling; 2017 25th European Signal Processing Conference (EUSIPCO)

Exact simulation of diffusion first exit times: algorithm acceleration

2020

In order to describe or estimate different quantities related to a specific random variable, it is of prime interest to generate such a variate numerically. In specific situations, the exact generation of random variables may be either momentarily unavailable or too expensive in terms of computation time, and therefore needs to be replaced by an approximation procedure. So far, the ambitious exact simulation of exit times for diffusion processes has been out of reach, even though it concerns many applications in different fields like mathematical finance, neuroscience or reliability. The usual way to describe exit times has been to use discretization schemes, that are of course approxim…

Subjects: [MATH.MATH-PR] Mathematics [math]/Probability [math.PR]; Probability (math.PR); primary 65C05, secondary 60G40, 68W20, 68T05, 65C20, 91A60, 60J60; diffusion processes; [MATH] Mathematics [math]; exit time; Brownian motion; rejection sampling; exact simulation; multi-armed bandit; randomized algorithm

Monte-Carlo Methods

2003

The article contains sections titled:
1 Introduction and Overview
2 Random-Number Generation
2.1 General Introduction
2.2 Properties That a Random-Number Generator (RNG) Should Have
2.3 Comments about a Few Frequently Used Generators
3 Simple Sampling of Probability Distributions Using Random Numbers
3.1 Numerical Estimation of Known Probability Distributions
3.2 “Importance Sampling” versus “Simple Sampling”
3.3 Monte-Carlo as a Method of Integration
3.4 Infinite Integration Space
3.5 Random Selection of Lattice Sites
3.6 The Self-Avoiding Walk Problem
3.7 Simple Sampling versus Biased Sampling: the Example of SAWs Continued
4 Survey of Applications to Simulation of Transport Processes 4.…
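As a small companion to the “Importance Sampling” versus “Simple Sampling” and Monte-Carlo integration sections of such an overview, the sketch below estimates the same one-dimensional integral both ways; the sharply peaked integrand and the Gaussian importance density are assumptions chosen only to make the variance difference visible.

```python
import numpy as np

# Estimate I = integral_0^1 exp(-100 (x - 0.5)^2) dx  (true value ~ sqrt(pi)/10)
# two ways: "simple sampling" with uniform draws, and importance sampling with
# a Gaussian importance density concentrated where the integrand is large.
rng = np.random.default_rng(6)
f = lambda x: np.exp(-100.0 * (x - 0.5) ** 2)
n = 10_000

# Simple sampling: x ~ U(0, 1), the estimate is the sample mean of f(x).
x_u = rng.uniform(0.0, 1.0, n)
simple_est = f(x_u).mean()

# Importance sampling: x ~ N(0.5, 0.1^2), weight by 1_[0,1](x) * f(x) / q(x).
mu, s = 0.5, 0.1
x_q = rng.normal(mu, s, n)
q = np.exp(-0.5 * ((x_q - mu) / s) ** 2) / (s * np.sqrt(2 * np.pi))
weights = np.where((x_q >= 0) & (x_q <= 1), f(x_q) / q, 0.0)
is_est = weights.mean()

print("simple sampling     :", round(simple_est, 5))
print("importance sampling :", round(is_est, 5))
print("true value          :", round(np.sqrt(np.pi) / 10, 5))
```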

Subjects: Rejection sampling; Monte Carlo method; Slice sampling; Sampling (statistics); Monte Carlo method in statistical physics; Statistical physics; Statistical mechanics; Umbrella sampling; Importance sampling; Mathematics